What are 'Cheapfake' Videos?
2020-02-15
An edited video of U.S. House Speaker Nancy Pelosi has raised questions about social media and political campaigns in the United States.
The video was produced last week after President Donald Trump's State of the Union speech.
The video shows Pelosi repeatedly tearing up a printed copy of the speech while the president spoke.
Trump posted the edited video on Twitter.
Pelosi did tear the pages of her copy of the speech.
But she did so only after Trump finished speaking - not during the address, as the edited video suggests.
Pelosi's office asked Twitter and Facebook to take down the video. Both companies refused to do so.
Researchers worry the video's "selective editing" could mislead people.
"Selective editing" is a term for editing or changing videos in a way that does not show what really happened.
Such recordings are sometimes called "cheapfake" videos.
Researchers fear the number of "cheapfakes" could increase if social media companies do not identify or make rules about such videos.
The United States has a long history of political candidates showing their opponents in a negative light.
Thomas Jefferson and John Adams attacked each other in newspaper advertisements over 200 years ago.
In 1960, John F. Kennedy's campaign used an ad showing different images of Richard Nixon sweating and looking weak.
In some ways, the edited video of House Speaker Pelosi is not unusual.
What is different now, says Clifford Lampe, is how widely such videos can spread in such a short time.
Lampe is a professor of information at the University of Michigan.
"The difference now is that the campaigns themselves, the president of (the) U.S. himself, is able to disseminate these pieces of media to the public," he said.
Lampe added that political candidates "no longer" have to work "with media outlets."
Facebook, Google and Twitter have reported on their efforts to reduce disinformation on their services.
The hope is to avoid some of the backlash created by social media misinformation during the U.S. elections four years ago.
But the video of Pelosi does not violate existing policies, both Twitter and Facebook said.
Facebook has rules that ban what are known as "deepfake" videos.
Such videos use artificial intelligence, or AI, technology to make it seem like someone "said words that they did not actually say."
Researchers say the Pelosi video is a "cheapfake" video, a video that has been changed without the use of AI.
Cheapfakes are much easier to create and are more common than deepfakes, notes Samuel Woolley.
He is director of propaganda research at the Center for Media Engagement at the University of Texas.
The Pelosi video is "deliberately designed to mislead and lie to the American people," Pelosi deputy chief of staff Drew Hammill tweeted last week.
He criticized Facebook and Twitter for not taking down the video from the social media services.
Andy Stone, who works for Facebook, reacted to Hammill's comments on Twitter.
Stone wrote, "Sorry, are you suggesting the President didn't make those remarks and the Speaker didn't rip the speech?"
Speaking with The Associated Press, Stone confirmed that the video did not violate the company's policy.
To be taken down, the video would have had to be created with newer, more advanced technology.
It also would have had to appear to show Pelosi saying words that she did not say.
Twitter did not remove the video either.
It pointed toward a blog post that says the company plans to start identifying tweets that contain "synthetic and manipulated media."
The new policy will take effect on March 5.
U.S. law does not say much about cheapfakes. Social media companies generally police their own websites.
A law, Section 230 of the Communications Decency Act, protects technology companies from most legal action related to the information posted on their sites.
Most social media companies now ban violent videos and videos that could cause real-world harm.
In recent years, Facebook, Twitter and Google's YouTube have received criticism about offensive videos that have appeared on their sites.
The companies sometimes remove the videos.
Other times, they leave the videos on their sites, pointing to the right to freedom of expression.
The future of misinformation through social media is unclear.
Jennifer Grygiel, an assistant professor at Syracuse University, called for laws to better govern social media in cases of political propaganda.
However, this proposal has some weaknesses, she admits.
One difficulty is that the "very people who will be regulating them [social media sites] are the same ones using them [social media sites] to get elected."
I'm John Russell.
And I'm Ashley Thompson.
Rachel Lerman reported on this story for The Associated Press. John Russell adapted it for VOA Learning English. George Grow was the editor.
________________________________________________________
Words in This Story
edit - v. to change or amend something for publication
page - n. a piece of paper
negative - adj. showing or talking about the bad qualities of someone or something
sweat - v. to release small, wet droplets from the skin
disseminate - v. to cause (something, such as information) to go to many people
backlash - n. a strong public reaction against something
deliberately - adv. in a way that is meant or planned
chief of staff - n. the top officer of a service
synthetic - adj. not real; related to a copy of a natural product; untrue or false
manipulate - v. to operate in a skillful way; to control or influence something unfairly
We want to hear from you. Write to us in the Comments Section.